Real-time human pose estimation on a smart walker using convolutional neural networks
Authors
Abstract
Rehabilitation is important for improving the quality of life of mobility-impaired patients. Smart walkers are a commonly used solution and should embed automatic and objective tools for data-driven, human-in-the-loop control and monitoring. However, present solutions focus on extracting a few specific metrics from dedicated sensors, with no unified full-body approach. We investigate a general, real-time pose estimation framework based on two RGB+D camera streams with non-overlapping views, mounted on a smart walker equipment used in rehabilitation. Human keypoint estimation is performed using a two-stage neural network framework. The 2D-Stage implements a detection module that locates body keypoints in the 2D image frames. The 3D-Stage implements a regression module that lifts and relates the keypoints detected in both cameras to the 3D space relative to the walker. Model predictions are low-pass filtered to improve temporal consistency. A custom acquisition method was used to obtain a dataset of 14 healthy subjects for training and evaluating the proposed framework offline, which was then deployed on the real walker equipment. An overall error of 3.73 pixels and 44.05 mm was reported, with an inference time of 26.6 ms when running on the walker's constrained hardware, presenting a novel approach to patient monitoring in the context of smart walkers. The framework is able to extract a complete and compact body representation in real time from inexpensive sensors, serving as a common base for downstream metrics extraction solutions and Human-Robot interaction applications. Despite promising results, more data should be collected on users with impairments to assess its performance as a rehabilitation tool in real-world scenarios.
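The abstract outlines a two-stage pipeline (per-camera 2D keypoint detection, followed by a 3D lifting regression that fuses both views) with low-pass filtering of the predictions for temporal consistency. The sketch below illustrates that structure in PyTorch; the layer sizes, keypoint count, input encoding, and filter constant are assumptions for illustration only and do not reflect the paper's actual architecture.

import torch
import torch.nn as nn

NUM_KEYPOINTS = 17  # assumed keypoint count; the paper's skeleton definition may differ


class KeypointDetector2D(nn.Module):
    """2D-Stage sketch: predicts one heatmap per body keypoint from a single RGB frame."""

    def __init__(self, num_keypoints: int = NUM_KEYPOINTS):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, num_keypoints, 1)  # one heatmap channel per keypoint

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(rgb))


class Lifter3D(nn.Module):
    """3D-Stage sketch: lifts detected keypoints from both cameras to 3D walker coordinates."""

    def __init__(self, num_keypoints: int = NUM_KEYPOINTS):
        super().__init__()
        # assumed input: (u, v, depth) per keypoint per camera, flattened into one vector
        self.mlp = nn.Sequential(
            nn.Linear(2 * num_keypoints * 3, 256), nn.ReLU(),
            nn.Linear(256, num_keypoints * 3),
        )

    def forward(self, kpts_cam1: torch.Tensor, kpts_cam2: torch.Tensor) -> torch.Tensor:
        x = torch.cat([kpts_cam1.flatten(1), kpts_cam2.flatten(1)], dim=1)
        return self.mlp(x).view(-1, NUM_KEYPOINTS, 3)


class LowPassFilter:
    """Exponential moving average over consecutive 3D predictions (one common low-pass choice)."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None

    def __call__(self, prediction: torch.Tensor) -> torch.Tensor:
        if self.state is None:
            self.state = prediction
        else:
            self.state = self.alpha * prediction + (1 - self.alpha) * self.state
        return self.state


if __name__ == "__main__":
    lifter = Lifter3D()
    filt = LowPassFilter()
    # dummy keypoints for both cameras: (batch, keypoints, (u, v, depth))
    k1 = torch.rand(1, NUM_KEYPOINTS, 3)
    k2 = torch.rand(1, NUM_KEYPOINTS, 3)
    smoothed = filt(lifter(k1, k2))
    print(smoothed.shape)  # torch.Size([1, 17, 3])

In this sketch the temporal smoothing is a simple exponential moving average applied to each 3D prediction; the paper only states that predictions are low-pass filtered, so the specific filter form here is an assumption.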
Similar resources
Real-time Human Pose Estimation with Convolutional Neural Networks
In this project, we seek to develop an accurate and efficient methodology to address the challenge of real-time head pose estimation. Though there have been previous attempts to apply convolutional neural networks to this fundamentally subtle task, to the best of our knowledge, there exist no initiatives that have directly utilized Convolutional Neural Networks (CNNs) in tandem with these metho...
Head Pose Estimation Using Convolutional Neural Networks
Detection and estimation of head pose is a fundamental problem in many applications such as automatic face recognition, intelligent surveillance, and perceptual human-computer interfaces. In an application like driving, the pose of the driver is used to estimate his gaze and alertness, where faces in the images are non-frontal with various poses. In this work the head pose of the person is used to ...
Head Pose Estimation Using Convolutional Neural Networks
Head pose estimation is a fundamental problem in computer vision. Several methods have been proposed to solve this problem. Most existing methods use traditional computer vision techniques, and the existing neural-network approach works on depth bitmaps. In this project, we explore using convolutional neural networks (CNNs) that take an RGB image as input to estimate the head pose. We use regressio...
Human Pose Estimation and Activity Classification Using Convolutional Neural Networks
In this paper, we investigate the problems of human pose estimation and activity classification using a deep learning approach. We constructed a CNN to address the regression problem of human joint location estimation, and achieved a PDJ score of about 60%. Furthermore, using weight initializations from an AlexNet trained to classify on ImageNet, we trained a deep convolutional neural network (...
Real-time Human Pose Estimation from Video with Convolutional Neural Networks
In this paper, we present a method for real-time multi-person human pose estimation from video by utilizing convolutional neural networks. Our method is aimed for use case specific applications, where good accuracy is essential and variation of the background and poses is limited. This enables us to use a generic network architecture, which is both accurate and fast. We divide the problem into ...
Journal
Journal title: Expert Systems With Applications
Year: 2021
ISSN: 1873-6793, 0957-4174
DOI: https://doi.org/10.1016/j.eswa.2021.115498